If, during the training of a neural network, you notice that some nodes have very low weights and contribute little to the model, which approach could you take to address this?
L1 Regularization.
L2 Regularization.
Dropout.
Ensemble Learning.
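For reference, the two regularization penalties among the options behave differently on small weights; a minimal numpy sketch (the `lam` value is an arbitrary illustrative choice):

```python
import numpy as np

def l1_penalty(w, lam=0.01):
    # L1 adds lam * sum(|w|); its gradient lam * sign(w) is constant in
    # magnitude, so it can push small weights all the way to exactly zero.
    return lam * np.sum(np.abs(w))

def l2_penalty(w, lam=0.01):
    # L2 adds lam * sum(w^2); its gradient 2 * lam * w shrinks weights
    # proportionally, so near-zero weights barely move.
    return lam * np.sum(w ** 2)

w = np.array([0.001, -0.002, 0.5, -0.8])  # two near-dead weights
print(l1_penalty(w), l2_penalty(w))
```

The near-zero entries still contribute noticeably to the L1 penalty but almost nothing to the L2 penalty, which is why the two behave differently on low-weight nodes.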
If a neural network has 3 layers of sizes (2, 3, 1) and no bias units, how many weights do we have to train?
3
6
9
12
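A quick way to check such counts, assuming fully connected layers with no bias units, is to multiply the sizes of each pair of adjacent layers:

```python
def num_weights(layer_sizes):
    # With no bias units, each pair of adjacent fully connected layers
    # contributes (units in) * (units out) weights.
    return sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))

print(num_weights((2, 3, 1)))  # 2*3 + 3*1
```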
Which activation function is more likely to cause a vanishing gradient?
Sigmoid
Tanh
ReLU
None of the above
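One way to reason about this is to compare the gradient magnitudes of the three activations away from zero; a small sketch:

```python
import numpy as np

def sigmoid_grad(x):
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)          # peaks at 0.25, shrinks fast as |x| grows

def tanh_grad(x):
    return 1.0 - np.tanh(x) ** 2  # peaks at 1.0, but also saturates

def relu_grad(x):
    return 1.0 if x > 0 else 0.0  # constant 1 for all positive inputs

x = 5.0
print(sigmoid_grad(x), tanh_grad(x), relu_grad(x))
```

Saturating activations (sigmoid, tanh) have gradients near zero for large |x|; multiplying many such factors across layers is what makes gradients vanish, while ReLU passes a gradient of 1 on its active side.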
You have designed a CNN architecture for a classification task and notice that your model is overfitting. Which change is most likely to reduce overfitting?
Increase the number of convolutional layers
Decrease the size of the filters in each convolutional layer
Decrease the number of filters in each convolutional layer
Decrease the size of the kernel in each max pooling layer
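To see how these options affect model capacity, it helps to count a convolutional layer's trainable parameters; a rough sketch (the channel counts below are arbitrary examples, not from the question):

```python
def conv_params(kernel, c_in, c_out, bias=True):
    # A 2-D conv layer learns kernel*kernel weights per (input channel,
    # filter) pair, plus one bias per filter.
    return kernel * kernel * c_in * c_out + (c_out if bias else 0)

# Halving the number of filters roughly halves the layer's parameters,
# which shrinks capacity; max-pooling kernel size adds no parameters at all.
print(conv_params(3, 64, 128), conv_params(3, 64, 64))
```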
What is the best way to ensure a CNN classifier can detect objects of various sizes and at different positions in image data?
Add more max pooling layers.
Add dropout, and remove ReLU activation functions from all convolutional layers.
Expand the training and validation data by adding images containing objects from more categories.
Expand the existing training and validation data through augmentation.
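Augmentation expands the existing data with label-preserving transforms; a minimal numpy sketch on a toy "image" (the flip probability and shift range are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(img):
    # Label-preserving transforms that expose the model to objects at
    # different positions: random horizontal flip and small translation.
    if rng.random() < 0.5:
        img = img[:, ::-1]               # horizontal flip
    shift = int(rng.integers(-2, 3))     # shift by -2..2 pixels
    img = np.roll(img, shift, axis=1)
    return img

img = np.arange(64, dtype=float).reshape(8, 8)
out = augment(img)
print(out.shape)  # the shape (and label) are preserved; only content moves
```

Real pipelines (e.g. random crops, scaling, rotation in a framework's augmentation API) follow the same idea: generate varied views of each training image without collecting new data.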